Extreme entropy machines: robust information theoretic classification
Authors
Abstract
Similar Resources
Extreme learning machines for Internet traffic classification
Network packet transport services (namely the Internet) are subject to significant security issues. This paper applies machine-learning methods based on neural networks (Extreme Learning Machines, or ELMs) to analyze Internet traffic in order to detect specific malicious activities. This is performed by classifying traffic for a key service run over the Internet: the Domain Name System ... (a minimal ELM sketch appears after this list).
Metagenomic Taxonomic Classification Using Extreme Learning Machines
Next-generation sequencing technologies have allowed researchers to determine the collective genomes of microbial communities co-existing within diverse ecological environments. Varying species abundance, read lengths, and complexity within different communities, coupled with the discovery of new species, make the problem of taxonomic assignment to short DNA sequence reads extremely challenging. We have...
An Information-Theoretic Discussion of Convolutional Bottleneck Features for Robust Speech Recognition
Convolutional Neural Networks (CNNs) have demonstrated their performance in speech recognition systems, both for feature extraction and for acoustic modeling. In addition, CNNs have been used for robust speech recognition, and competitive results have been reported. A Convolutive Bottleneck Network (CBN) is a kind of CNN that has a bottleneck layer among its fully connected layers. The bottleneck fea...
Information Theoretic Interpretations for H∞ Entropy
Based on studies of information transmission in discrete multivariable linear time-invariant (LTI) systems disturbed by stationary noise, the relations among entropy rate, mutual information rate, and H∞ entropy are discussed for both the general control problem and the classic tracking problem. For general control systems, equivalent relations between the entropy rate and the H∞ entropy are formulated by u...
Information-Theoretic Learning Using Renyi’s Quadratic Entropy
Learning from examples has traditionally been based on correlation or on the mean square error (MSE) criterion, despite the fact that learning is intrinsically related to the extraction of information from examples. The problem is that Shannon's Information Entropy, which has a sound theoretical foundation, is not easy to implement in a learning-from-examples scenario. In this paper, Reny... (see the entropy-estimator sketch after this list).
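The first related entry above builds on Extreme Learning Machines. The following is a minimal sketch of the core ELM idea only (a randomly drawn, untrained hidden layer plus a least-squares fit of the output weights), assuming NumPy and one-hot class targets; it is not the traffic-classification pipeline of that paper, and the function names are illustrative.

```python
import numpy as np

def elm_train(X, T, n_hidden=100, rng=None):
    """Fit a basic Extreme Learning Machine.

    X: (n_samples, n_features) inputs; T: (n_samples, n_classes) one-hot targets.
    Hidden-layer weights are random and never updated; only the output
    weights are fitted, via a Moore-Penrose pseudoinverse.
    """
    rng = np.random.default_rng(rng)
    W = rng.normal(size=(X.shape[1], n_hidden))  # random input weights
    b = rng.normal(size=n_hidden)                # random hidden biases
    H = np.tanh(X @ W + b)                       # hidden-layer activations
    beta = np.linalg.pinv(H) @ T                 # least-squares output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    """Return predicted class indices for inputs X."""
    H = np.tanh(X @ W + b)
    return np.argmax(H @ beta, axis=1)
```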
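The last entry relies on Renyi's quadratic entropy, which, unlike Shannon's entropy, admits a simple pairwise Parzen-window estimator: with a Gaussian kernel of width sigma, the "information potential" V is the average of pairwise Gaussian kernel evaluations, and H2 = -log V. A minimal NumPy sketch under that assumption (function name and defaults are illustrative, not taken from the cited paper):

```python
import numpy as np

def renyi_quadratic_entropy(X, sigma=1.0):
    """Parzen-window estimate of Renyi's quadratic entropy H2 = -log ∫ p(x)^2 dx.

    X: (n_samples, n_features). Convolving two Gaussian kernels of width
    sigma yields a Gaussian of variance 2*sigma^2, so the estimator is the
    mean pairwise kernel value (the information potential), negated in log.
    """
    n, d = X.shape
    diff = X[:, None, :] - X[None, :, :]          # pairwise differences, (n, n, d)
    sq_dist = np.sum(diff ** 2, axis=-1)          # squared Euclidean distances
    var = 2.0 * sigma ** 2                        # kernel variance after convolution
    norm = (2.0 * np.pi * var) ** (d / 2.0)       # Gaussian normalization constant
    V = np.mean(np.exp(-sq_dist / (2.0 * var)) / norm)  # information potential
    return -np.log(V)
```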
Journal
Journal title: Pattern Analysis and Applications
Year: 2015
ISSN: 1433-7541, 1433-755X
DOI: 10.1007/s10044-015-0497-8